Patent abstract:
SYSTEM AND METHOD OF OPERATION FOR REMOTELY OPERATED VEHICLES WITH SUPERIMPOSED 3D IMAGERY. The present invention relates to a system and method of using superimposed 3D imagery for remotely operated vehicles (ROVs), namely 3D reconstructed images of the ROV environment. In another aspect, this invention includes generating a virtual video of 3D elements in the operating environment; synchronizing the camera angle and position of the virtual video with the angle and position of a real camera; and overlaying the virtual video and the real video from the real camera, with one video manipulated to show transparency in areas of lesser interest so that the other video shows through. The present invention may also include overlaying information, whether graphical, textual or both, on the hybrid virtual-real 3D image. The present invention is also networked, so that the visual immersion interface described above is accessible to a plurality of users operating from a plurality of locations.
Publication number: BR112014011172B1
Application number: BR112014011172-3
Filing date: 2012-11-08
Publication date: 2021-07-27
Inventor: Manuel Alberto Parente Da Silva
Applicant: Abyssal S.A.
IPC main classification:
Patent description:

[001] The descriptions of the published patent documents referred to in this application are incorporated herein by reference in their entirety in order to fully describe the state of the art to which this invention pertains.
[002] The present invention relates to a three-dimensional ("3D") control and navigation system for remotely operated vehicles ("ROVs"), and methods for its use. Specifically, the present invention provides a navigation and control system that is standardized to be compatible with a wide range of ROV options.
BACKGROUND OF THE INVENTION
[003] The exploration of the last frontier on Earth, the sea, is largely driven by the continuing demand for energy resources. As humans are unable to withstand the pressures induced at the depths where energy resources occur, they have come to rely more and more on ROV technology. The future of ocean exploration is only as fast, reliable and secure as the available technology.
[004] The prior art related to augmented-reality navigation, such as U.S. Pre-Grant Publication 2011/0153189, describes systems for superimposing 3D objects onto a video feed, but does not provide devices crucial for dealing with the vicissitudes of underwater navigation. Conversely, graphical systems used for underwater navigation, such as those described in U.S. Pre-Grant Publication 2009/0040070 A1 and U.S. Patent 8,015,507, do not provide a navigation interface that creates an immersive visual experience for the pilot or user.
[005] A major shortcoming of available ROV navigation technology is its inability to provide complete spatial awareness, i.e., the ability to consistently know the past and current flight path. Today's navigation systems rely on conventional telemetry information, including depth, pitch, roll, camera tilt and heading. However, the positioning systems that provide the geographic location of the ROV have not been fully integrated with the depth and orientation instruments that make up conventional telemetry systems.
[006] Another important aspect of underwater exploration is the acquisition and application of information related to seabed and underwater structures. Modern multi-beam sonar devices with modeling software provide detailed 3D bathymetric data, which is essential for planning and evaluating exploration missions. This bathymetric data is used extensively by supervisors and client representatives in the energy resources industry. However, conventional systems do not integrate bathymetric modeling with real-time navigation in a way that is directly useful to the ROV pilots themselves.
[007] Like jet fighter pilots, ROV pilots need to navigate in three dimensions, in real time, and in conditions where visibility may be limited to between 2 and 10 meters. Both types of pilots therefore need to receive fast, reliable and intelligent data in low-visibility conditions. However, the dynamic user interfaces used for complex aviation missions, which encompass quantitative environmental information, textual warnings and other important graphics, have not been made available for comparably complex underwater missions.
[008] A key to establishing a successful operating system for ROV missions is providing effective collaboration and communication among all persons involved in the project. The ability to introduce new data and share data among system users, and particularly with the pilot, advantageously increases efficiency and safety.
[009] Therefore, there is a need for an improved approach to pilot-ROV interaction, in which information can be visualized and recorded in three spatial dimensions and in real time.
SUMMARY OF THE INVENTION
[010] This description provides tools and resources that implement systems and methods related to ROV operation with navigation information and superimposed 3D imagery. Although the embodiments and examples are provided in the context of underwater missions, one skilled in the art should appreciate that the aspects, features, functionality, etc., discussed in this description can also be extended to virtually any type of complex navigation project.
[011] In one aspect of this description, an operating and navigation system is provided to allow full integration with a wide variety of ROV control systems. That is, the invention enables engineers and supervisors to plan one or several missions in a common, standardized system that can be used by a wide variety of pilots and operators for the operation and navigation of ROVs.
[012] In another aspect, the invention provides a navigation system that visually presents any relevant geographic information related to the planned flight path, waypoints, checkpoints, workplaces and procedures provided by an operating system. It provides for data collection and transfer so that a user can archive procedures during operation, update the system when tasks are completed, and produce video and/or text based mission reports with the collected data. Therefore, the system provides fully up-to-date status information regarding mission progress and the position of the ROV.
[013] In a preferred embodiment, the invention includes computing hardware, one or more display screens, sonar technology, and software for a graphical user interface, all of which can interact with one or more ROVs. The present invention also provides a database and software for execution by the computing hardware, including several modules to: obtain, save and model 3D elements in the ROV operating environment; synchronize a virtual camera viewing the modeled image with the real camera of an ROV; provide a hybrid 3D image by overlaying real camera images and modeled images so that areas of high interest are more visible or opaque than areas of lesser interest; and superimpose graphical or textual information on the hybrid 3D image. Additional software modules, which work with the user interface, are provided to plan, supervise, archive, share, and report on all aspects of ROV-mediated exploration.
[014] By implementing the various tools and resources mentioned above and discussed in detail below, the present system can greatly improve the way subsea operations are conducted. For example, the invention provides ROV operators with immersive operating systems to plan, supervise and report on missions. Improved visualization of the underwater environment and improved communication with supervisors and other pilots will lead to better overall efficiency, and less stress during long pilot shifts. The efficiency of data transfer and task completion created by the operating system will also bring additional benefits, such as reduced operational costs and a consequent increase in throughput. Additionally, more efficient use of ROVs due to improved mission logistics will extend the period of use of these ROVs. Benefits for entities initiating and funding subsea research include increased access to more detailed reports, and even to data acquired in real time, without having to visit the research site. These are just some of the benefits provided by the system.
BRIEF DESCRIPTION OF THE DRAWINGS
[015] The above mentioned aspects and other aspects, characteristics and advantages can be better understood from the following detailed description with reference to the attached drawings, in which:
[016] Figure 1A illustrates a diagrammatic view of a system, according to an exemplary embodiment;
[017] Figure 1B illustrates a diagrammatic view of a system and its associated functions, according to another exemplary embodiment;
[018] Figures 2A and 2B depict alternative views of a user interface of a system according to another exemplary embodiment of the invention;
[019] Figures 3A and 3B illustrate software overviews of a system, in an exemplary embodiment;
[020] Figure 3C is a diagrammatic illustration of networked systems, in an exemplary embodiment;
[021] Figure 4 describes modules to achieve hybrid 3D imaging, and a method for its use, according to yet another embodiment of the invention;
[022] Figure 5A illustrates calculations to align a virtual video and a real video, according to an exemplary embodiment;
[023] Figure 5B illustrates a hybrid 3D image obtained by superimposing a virtual video and a real video, according to another exemplary embodiment; and
[024] Figures 6A to 6E depict views of a navigation interface, according to exemplary embodiments.
DETAILED DESCRIPTION OF THE INVENTION
[025] The invention provides systems for operating a remotely operated vehicle comprising: a) a database module of 3D elements representing objects arranged in an operating environment of said vehicle, the 3D elements comprising data acquired by multi-beam sonar; b) a virtual video generation module to generate a virtual video incorporating said 3D elements; c) a vehicle-mounted video camera to generate a real video of said vehicle's operating environment; d) a synchronization module to synchronize an angle and position of a virtual camera with an angle and position of the vehicle-mounted video camera, where the virtual camera defines a field of view for the virtual video; and e) an overlay module to overlay said virtual video and said real video, wherein said overlay module is configured to modulate the transparency or opacity of a region of lesser interest in one of the virtual or real videos such that a corresponding region of the other video is more visible, said superimposed virtual and real videos comprising a hybrid 3D image.
[026] In an embodiment of the present invention, the overlay module is configured to overlay graphic information, textual information, or both, on the hybrid 3D image. The overlay can be based on a luminance threshold, where luminance in Red-Green-Blue hexadecimal format can be set to values between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40.
[027] The invention also provides a system for underwater exploration comprising: a) a remotely operated vehicle (ROV), wherein said ROV comprises a camera to acquire a real video; b) a network operating system comprising a computer and computer-executable software, said network operating system comprising a display mechanism, wherein said display mechanism also comprises: i. a database module of 3D elements representing objects arranged in the operating environment of said vehicle; ii. a virtual video generation module for generating a virtual video incorporating said 3D elements; iii. a synchronization module for synchronizing an angle and position of a virtual camera with an angle and position of the vehicle-mounted video camera, wherein the virtual camera defines a field of view for the virtual video, said field of view extending between about 45 and 144 degrees from a central point of view; and iv. an overlay module for superimposing said virtual video and said real video, wherein said overlay module is configured to modulate the transparency or opacity of a region of lesser interest in one of the virtual or real videos so that a corresponding region of the other video is more visible, said superimposed virtual and real videos comprising a hybrid 3D image; and c) a navigation interface configured to display said hybrid 3D image, said navigation interface comprising at least one network monitor.
[028] In another embodiment of the invention, the operating system is capable of sharing data with a plurality of remote monitors, wherein said plurality of monitors can include up to 12, and preferably between 3 and 9 monitors.
[029] In yet another embodiment of the invention, the system also comprises an external system configured to determine whether the system is working or is in a fault state. The external system can be configured to switch the hybrid 3D image monitor input to the live video feed if the system is in a fault state.
[030] In yet another embodiment of the invention, the navigation interface comprises at least three network monitors, wherein said monitors are arranged adjacent to each other so that the intermediate monitor displays video with augmented reality, while the two side monitors display an expanded view of an operating field.
[031] In yet another embodiment of the invention, the navigation interface comprises one or more touch screens, one or more speakers to provide audio and audible warnings, one or more microphones to receive voice commands, one or more joysticks, one or more gamepads, and/or one or more computer mice.
[032] In yet another embodiment of the invention, the functions of the operating system are separated so that the operating system is compatible with a plurality of hardware options.
[033] The invention also provides a method of operating a remotely operated vehicle comprising: a) obtaining 3D bathymetry data using multi-beam sonar; b) storing 3D elements in a database module, said 3D elements representing objects arranged in the operating environment of the remotely operated vehicle and comprising said 3D bathymetry data; c) generating a virtual video of said 3D elements; d) synchronizing an angle and position of a virtual camera with an angle and position of the vehicle-mounted video camera, wherein the virtual camera defines a field of view for the virtual video; and e) overlaying said virtual video and real video, where the overlay also comprises the step of modulating the transparency or opacity of a region of lesser interest in one of the virtual or real video feeds so that a corresponding region of the other video is more visible, said superimposed virtual and real video feeds comprising a hybrid 3D image.
[034] In a further embodiment, a method according to the invention comprises the step of overlaying graphical information, textual information, or both, on the hybrid 3D image.
[035] In yet another embodiment of the invention, the step of modulating the transparency or opacity also comprises establishing a luminance threshold. In an exemplary embodiment, setting the luminance threshold comprises keeping a virtual video background at a higher transparency than the real video background.
[036] In yet another embodiment of the invention, the synchronization of the angle and position of the cameras comprises the steps of: a) calculating an angle between a vehicle heading and a direction of a real camera field of view; b) calculating an angle between a vehicle's vertical orientation and the direction of the real camera's field of view; and c) calculating an angle between the vehicle and a geographic horizon.
[037] In a further embodiment, a method according to the invention also comprises the step of displaying said hybrid 3D image in a navigation interface. In an exemplary embodiment, the display step also comprises modulating the amount of 2D or 3D information superimposed on the video screen based on the position of the ROV with respect to the underwater structures. In another exemplary embodiment, the display step also comprises providing a mini-map, said mini-map defining a computer-generated graphic that illustrates either or both cardinal points or a position of an object in 3D.
[038] In yet another embodiment, a method according to the invention comprises analyzing the displayed information to determine the status of procedures.
[039] In yet another embodiment, a method according to the invention comprises updating the displayed information as tasks are completed.
[040] In yet another embodiment, a method according to the invention comprises determining the positioning of said 3D elements using a global positioning system.
[041] In yet another embodiment, a method according to the invention comprises the step of planning an underwater exploration mission, wherein said planning comprises entering user-determined information to display in the navigation interface, said user-determined information comprising any one or combination of bathymetry information, ROV point of view information, ROV checkpoint information, procedure information, timed procedure information, flight path information, 3D modeled elements, GPS-determined position information, and pilot information. In an exemplary embodiment, the method also comprises the step of configuring the operating system so that entering, exiting or staying longer than a designated time in a GPS-determined position triggers any one or combination of an alarm, notification, change of procedure or change of task.
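By way of a non-limiting illustration of such GPS-triggered actions, the following Python sketch models an "actionable area" that fires a callback on entry, exit, or overstay; the class name, the spherical-region geometry, and the callback mechanism are assumptions made for illustration only, not elements of the claimed system.

```python
import time

class ActionableArea:
    """A geo-positioned region that triggers an action when the vehicle
    enters, exits, or stays longer than a designated dwell time."""

    def __init__(self, center, radius_m, dwell_limit_s, on_event):
        self.center = center            # (x, y, depth) in meters (assumed frame)
        self.radius_m = radius_m
        self.dwell_limit_s = dwell_limit_s
        self.on_event = on_event        # callback taking an event name
        self._inside = False
        self._entered_at = None

    def _distance(self, pos):
        return sum((a - b) ** 2 for a, b in zip(pos, self.center)) ** 0.5

    def update(self, rov_position, now=None):
        """Feed in each new ROV position fix; fires enter/exit/dwell events."""
        now = time.monotonic() if now is None else now
        inside = self._distance(rov_position) <= self.radius_m
        if inside and not self._inside:
            self._inside, self._entered_at = True, now
            self.on_event("enter")
        elif not inside and self._inside:
            self._inside, self._entered_at = False, None
            self.on_event("exit")
        elif inside and now - self._entered_at > self.dwell_limit_s:
            self.on_event("dwell_exceeded")   # e.g. raise an alarm or notification
            self._entered_at = now            # re-arm the dwell timer

# Example: alert when loitering near a wellhead for more than 60 seconds.
# area = ActionableArea((120.0, 40.0, 85.0), 15.0, 60.0, print)
# area.update((119.0, 41.0, 84.0))
```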
[042] In yet another embodiment, a method according to the invention comprises the step of recording an underwater exploration mission, wherein said recording comprises recording any one or combination of telemetry data, sonar data, 3D models, bathymetry data, flight path information, ROV viewpoint information, ROV checkpoint information, procedure information, positioning information, and inertia data. In an exemplary embodiment, the method also comprises reporting information that has been recorded. In an exemplary embodiment, the method also comprises saving information that has been recorded to a networked database. In an exemplary embodiment, the method also comprises producing a report based on the saved information, said report providing a 3D recreation of the operation.
[043] The invention also provides a computer program product, stored on a computer-readable medium, to implement any method according to the invention as described herein.
[044] As mentioned above, several aspects and functionalities are discussed here by way of examples and embodiments in an ROV navigation context for use in underwater exploration. In describing such examples and exemplary embodiments, specific terminology is employed for the sake of clarity. However, this description is not intended to be limited to the examples and exemplary embodiments discussed herein, nor to the specific terminology used in such descriptions, and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner.
Definitions
[045] The following terms are defined as follows:
[046] 3D elements; 3D objects - Data that define three-dimensional shapes by modeling sonar-derived input or user-determined input.
[047] Abstraction; abstraction layer - A characteristic of executable software in which different data formats are standardized into a common format so that components are compatible.
[048] Data engine - A collection of modules, according to an embodiment of this invention, that is responsible for at least the acquisition, storage and reporting of data collected during the course of an ROV mission.
[049] Failed state - A state, defined by a user or by a default, in which the functionality of the system, according to an embodiment of the invention, has degraded to an unacceptable level.
[050] Luminance threshold - A system-determined value of RGB (Red, Green, Blue) pixel color intensity that defines a visible but transparent state for images rendered by a digital image output device.
[051] Module - A combination of at least one computer processor, computer memory, and custom software that performs one or more defined functions.
[052] Navigation engine - A collection of modules, according to an embodiment of this invention, which is responsible for building the interactive Navigation Interface, and for producing data to display in the Navigation Interface.
[053] Positioned; geo-positioned; geographically identified - Having a location defined by Global Positioning System satellites and/or acoustic or inertial positioning systems; and optionally having a location defined by a depth below sea level.
[054] ROV - A remotely operated vehicle; often a watercraft.
[055] Visualization engine - A collection of modules, according to an embodiment of this invention, which is responsible for producing the displayed aspect of the navigation interface.
System Hardware and Devices
[056] Referring now to the drawings, in which like reference numerals designate identical or corresponding parts throughout the several views, Figure 1A diagrammatically depicts a system according to an embodiment of the invention. This system includes an ROV and its associated instrumentation 1, an operating system housed within computer hardware 3, and a user interface and its associated devices 2. The operating system 3 mediates interaction between ROV 1 and user 4, so that the user can submit commands to ROV 1, acquire information from ROV 1, and obtain mechanical responses and data output from ROV 1.
[057] As seen in Figure 1B, operating system 3 can receive live information obtained by the real-time multi-beam 3D sonar of ROV 1, telemetry data, positioning data and video data, as well as programmed 3D objects from the database 5, and process these data to provide live 3D models of the environment, both for augmented reality and for the full 3D views displayed on UI 2. UI 2 can also be used to display video taken using the digital instrumentation of ROV 1, including, for example, cameras and other sensors. The ROV 1 used in the system of the present invention is equipped with conventional instrumentation for telemetry and positioning that is responsive to immediate commands by operating system 3.
[058] In one embodiment of the invention, the hardware for operating system 3 includes a high quality cabinet computer that can be easily integrated with any ROV control system. The various software modules that also define the operating system will be described in more detail below.
[059] Referring to Figures 2A and 2B, the human-machine interface includes at least one monitor 7, and preferably three interactive monitors 7 for navigation. In accordance with an embodiment illustrated in Figure 2A, the central monitor 7 provides a video feed with augmented reality (AR), while the side monitors provide an expansion of the operating field of view. In another aspect, the side monitors can allow the user to have a panoramic view of the ROV environment using a full 3D view from the ROV point of view. As seen in Figure 2B, the interaction between the user and the system can use joysticks 8 or gamepads. In another embodiment, the user interface 2 may employ touch-screen or multi-touch technology, audio and audible prompts, voice commands, a computer mouse, etc.
Functional Modules
[060] Rather than developing a different operating system 3 for each make and model of ROV 1, the present invention works by abstraction, such that the described operating system 3 and associated hardware work in the same way with all ROVs 1. For example, if one component delivers "$DBS,14.0,10.3" as depth and heading coordinates, and another component delivers "$HD,15.3,16.4" as heading and depth coordinates, those data strings are parsed into their respective variables: Depth1=14.0, Heading1=10.3, Heading2=15.3, Depth2=16.4. This parsing allows both systems to work in the same way despite the differences in data format.
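For illustration, a minimal Python sketch of this parsing step follows; the sentence tags and field orders are taken from the example above, while the function name and the normalized dictionary format are assumptions.

```python
def parse_telemetry(sentence: str) -> dict:
    """Normalize vendor-specific telemetry sentences to one common format."""
    fields = [f.strip() for f in sentence.split(",")]
    tag = fields[0]
    if tag == "$DBS":                 # component A: depth first, then heading
        return {"depth": float(fields[1]), "heading": float(fields[2])}
    if tag == "$HD":                  # component B: heading first, then depth
        return {"depth": float(fields[2]), "heading": float(fields[1])}
    raise ValueError(f"unknown telemetry sentence: {tag}")

assert parse_telemetry("$DBS,14.0,10.3") == {"depth": 14.0, "heading": 10.3}
assert parse_telemetry("$HD,15.3,16.4") == {"depth": 16.4, "heading": 15.3}
```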
[061] By developing a driver abstraction layer for communication between the operating system 3 and the ROV hardware, user 4 is provided with continuous data communication, and is not restricted to using specific ROV models. This abstraction also allows users 4 and systems 3 to communicate and network information between multiple systems, and to share information between multiple subsea projects. The use of a single system also allows for reduced costs of training, maintenance and operation of this system.
[062] Figure 3A depicts a software architecture overview illustrating the component parts of ROV 1, user interface 2 and operating system 3. Software drivers are provided for ROV telemetry, positioning, video and sonar instrumentation. To implement user functions including planning, recording, navigation, supervision and interrogation, operating system 3 provides a navigation engine and a data engine. The operating system 3 is networked so that connected service and external control units can provide real-time data input. One such external control unit can be configured as a "watchdog". The external "watchdog" system can perform periodic checks to determine whether the system is working properly or is in a failed state. If the system is in a failed state, the watchdog can switch the monitor inputs, bypassing the system, to a conventional live video feed until the system is operating correctly.
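A minimal sketch of such a watchdog loop follows, assuming a heartbeat timestamp exposed by the operating system and two hypothetical functions for switching the monitor input; none of these names come from the actual implementation.

```python
import time

HEARTBEAT_TIMEOUT_S = 2.0   # assumed threshold for declaring a failed state

def watchdog(get_last_heartbeat, switch_to_live_feed, switch_to_hybrid):
    """Periodically check system health and fail over the monitor input."""
    degraded = False
    while True:
        healthy = time.monotonic() - get_last_heartbeat() < HEARTBEAT_TIMEOUT_S
        if not healthy and not degraded:
            switch_to_live_feed()    # failed state: show the raw camera feed
            degraded = True
        elif healthy and degraded:
            switch_to_hybrid()       # recovered: restore the hybrid 3D image
            degraded = False
        time.sleep(0.5)              # assumed check period
```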
[063] Figure 3B depicts an additional software architecture overview illustrating that the operating system 3, which mediates the aforementioned user functions, is networked to provide communication between a multi-touch supervision console and a pilot or pilots. Figure 3C illustrates yet another level of connectivity, where a first ROV's navigation system can share all of its dynamic data with another ROV's navigation system over a network.
Visualization Engine
[064] As seen from Figures 1B and 3A, the visualization engine of the operating system 3 includes modules to implement three-dimensional ("3D") and two-dimensional ("2D") imagery, and to provide real-time environment updates. Three modules are illustrated in Figure 4, which shows, step by step, how the system operates to create a hybrid superimposed 3D image.
[065] A 3D database module 10 includes advanced 3D rendering technology to allow all stages of the ROV operation to be performed with reference to a visually recreated 3D deepwater environment. This environment is composed of seafloor bathymetry and modeled equipment, e.g., ocean energy device structures.
[066] As discussed above, the main sources of image data are pre-recorded 3D modeling of sonar data (i.e., computer-generated 3D video) and possibly other video data; the live sonar data and video data taken in real time; the 3D elements determined by the user; and textual and graphical communications intended to be displayed on the user interface screen. The geographic position and depth of any elements or regions included in the image data are known from GPS positioning, the use of acoustic and/or inertial positioning systems, and/or by reference to maps.
[067] In a preferred embodiment of the invention, a virtual video generation module 11 is provided to use previously stored 3D elements, or 3D elements detected in real time, to create a virtual video of such 3D elements. The virtual video generation module 11 can work in combination with the synchronization module 12.
[068] Synchronization module 12 aligns the angle and position of the virtual video's virtual camera with the angle and position of a real camera on an ROV. According to an embodiment, the virtual camera defines a field of view for the virtual video, which may preferably extend between 45 and 144 degrees from a central point of view. As illustrated in Figure 5A, alignment of the virtual and real camera angles can be accomplished by calculating the angle between the ROV heading and the direction of the camera's field of view; calculating the angle between the vertical of the ROV and the direction of the camera's field of view; and calculating the angle between the ROV and the geographic horizon. These calculated angles are then used to determine equivalent object screen coordinates on the digital X-Y axes, at given time intervals or whenever a variable changes value.
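As a sketch of this synchronization step, the following Python functions compose the three angles described above into a virtual camera pose and map angular offsets to X-Y screen coordinates; the linear projection and all parameter names are simplifying assumptions, not the patented implementation.

```python
def virtual_camera_pose(rov_heading, rov_pitch, rov_roll, cam_pan, cam_tilt):
    """Compose ROV attitude with camera pan/tilt (degrees) so the virtual
    camera looks along the same axis as the real camera."""
    yaw = (rov_heading + cam_pan) % 360.0   # angle vs. the ROV heading
    pitch = rov_pitch + cam_tilt            # angle vs. the vertical orientation
    roll = rov_roll                         # angle vs. the geographic horizon
    return yaw, pitch, roll

def screen_coords(yaw_offset, pitch_offset, fov_h, fov_v, width, height):
    """Map an object's angular offset from the camera axis (degrees) to
    X-Y pixel coordinates for a given field of view and screen size."""
    x = width / 2 + (yaw_offset / (fov_h / 2)) * (width / 2)
    y = height / 2 - (pitch_offset / (fov_v / 2)) * (height / 2)
    return x, y
```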
[069] An overlay module 13, whose function is further diagrammed in Figure 5B, is provided to overlay the generated virtual video 20 and the synchronized real-time video 21 acquired by the ROV's digital camera. The result is the hybrid 3D overlay image 22, where the system effectively draws the generated 3D environment on top of the non-visible portion of the video feed, thereby greatly increasing visibility for the ROV pilot. Specifically, the overlay software splits the camera feed video and the generated 3D video into various layers in the z-buffer of the 3D rendering system. This allows the layers to be flattened and overlapped, which simulates spatial perception and facilitates navigation.
[070] Yet another aspect of the overlay module 13 is that either or both of the virtual 20 or real 21 videos can be manipulated, based on the luminance threshold, to be more transparent in areas of lesser interest, thereby allowing the corresponding area of the other video feed to show through. According to an embodiment of the invention, the luminance in Red-Green-Blue hexadecimal format can be between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40. Areas of lesser interest can be selected by a system default, or by the user. The color intensity of the images in the areas of lesser interest is adjusted to the luminance threshold, and the corresponding region of the other video is adjusted to normal luminance. In the example illustrated in Figure 5B, the background of the virtual video 20 is kept relatively more transparent than the foreground. Thus, when the real video 21 is superimposed on the virtual 3D image 20, the real video 21 is selectively augmented mainly with the virtual foreground, which contains an underwater structure of interest.
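A minimal NumPy sketch of this luminance-threshold compositing follows; the use of the maximum RGB channel as the brightness measure, the default threshold of 40 (echoing the 40-40-40 bound mentioned above), and the array shapes are all illustrative assumptions.

```python
import numpy as np

def composite(virtual_rgb, real_rgb, threshold=40):
    """Overlay a virtual frame on a real frame: virtual pixels at or below
    the luminance threshold are treated as regions of lesser interest and
    let the real video show through."""
    luminance = virtual_rgb.max(axis=-1)        # crude per-pixel brightness
    keep_virtual = luminance > threshold        # bright = region of interest
    return np.where(keep_virtual[..., None], virtual_rgb, real_rgb)

# Usage with two frames stored as uint8 arrays of shape (height, width, 3):
# hybrid = composite(virtual_frame, real_frame, threshold=40)
```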
Navigation Engine
[071] The on-screen 2D Navigation Interface for the ROV pilot involves overlaying geo-positioned data or technical information in the 2D rendering system. The geographic positioning or geographic identification of data and elements is performed by reference to maps or global positioning satellites. The resulting Navigation Interface, as seen in Figures 6A through 6D, is reminiscent of the leading display consoles used in aviation. In the case of underwater navigation, the display is configured to indicate the position of ROV 1 based on known coordinates, and by the use of a sonar system that records 3D images from the position of the ROV for later navigation. In this way, the present invention provides immersive visualization of the ROV operation.
[072] Figure 6A illustrates the overlay of textual information and symbols 30 on the 2D video rendering of the ROV user interface. Figure 6B illustrates the overlay of 3D elements in the video rendering. Overlaying this data on the video feed is useful not only for ROV 1 navigation and control, but also for performing the planning and supervision functions of the operating system 3. This overlay can be performed in a manner similar to the overlay of the video feeds, that is, by obtaining the screen coordinates of an object and rendering text and numbers close to those coordinates.
[073] The planning module enables engineers and/or supervisors to plan one or several ROV missions. Referring again to Figure 6A, an important aspect of the planning module is the entry and presentation of bathymetry information 32 through 3D visualization. As seen in the Navigation Interface, waypoints 34 are superimposed on the video feed. These elements can be identified, for example, by number, and/or distance from a reference point. In other words, in addition to overlaying technical specifications and status information 30, for ROV 1 or other relevant structures, the Navigation Interface also provides GPS-determined positions for navigation and pilot information.
[074] In another aspect of the invention, procedures 35, including timed procedures (fixed-position observation tasks, for example), can be included in the Navigation Interface as text. Given this procedural information, an ROV pilot can anticipate and complete tasks with greater accuracy. A user can also use the system to define actionable areas. Actionable areas are geographically positioned areas in the subsea environment that trigger a system action when the ROV enters, exits, or stays longer than a designated time. The triggered action can be an alarm, notification, change of procedure, change of task, etc. Referring to Figure 6C, using a series of rules established in the planning module, or by manual entry, the system can show more or less 2D geographically identified information in the Navigation Interface. For example, as seen at 36, during an ROV operation in which the pilot is within 100 meters of a geographically identified object, the system may show only general information related to that overall structure, or specific information needed for a specific current task in the nearby area. As the pilot approaches the geographically identified structure, illustrated at 37, the system can gradually display more information overlaid on the components of that structure. This dynamic level-of-detail control, whether manual or rule-based, can apply to textual and symbolic information 30 as well as to the augmenting 3D elements 31.
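As a non-limiting sketch of this distance-based rule, the following Python functions tier the overlaid detail by range; the 100-meter figure comes from the passage above, while the 25-meter tier and the level labels are assumptions.

```python
def detail_level(distance_m):
    """Return how much overlaid information to show for a geographically
    identified structure, given the ROV's distance to it."""
    if distance_m > 100.0:       # beyond the 100 m rule described above
        return "none"
    if distance_m > 25.0:        # assumed intermediate tier
        return "general"         # general structure information only
    return "component"           # per-component annotations up close

def visible_overlays(overlays, rov_pos, dist):
    """Filter overlay records, each a dict with 'pos' and 'level' keys,
    where dist(a, b) returns the distance between two positions."""
    rank = {"none": 0, "general": 1, "component": 2}
    return [o for o in overlays
            if rank[o["level"]] <= rank[detail_level(dist(rov_pos, o["pos"]))]]
```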
[075] With reference to Figure 6D, the planning module can also provide on-screen information related to the flight path 38. As seen in Figure 6E, another important aspect of the invention is embodied by a mini-map 39, that is, a graphic superimposed on the video, which can include several different representations, such as small icons representing target objects. The mini-map 39 can show the cardinal points (North, South, East, West) in a 3D representation, optionally in addition to a representation of a relevant object in three-dimensional space. The mini-map 39 can be placed in a corner, and can be moved, dismissed and recalled by the user.
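By way of illustration, a small Python sketch of placing a target icon on such a mini-map follows; the circular map, the heading-up rotation, and the range scaling are assumptions, not the patented design.

```python
import math

def minimap_icon(rov_heading_deg, target_bearing_deg, target_dist_m,
                 map_radius_px=60, max_range_m=200.0):
    """Place a target icon on a heading-up circular mini-map centered on
    the ROV; returns (x, y) pixel offsets from the map center."""
    rel = math.radians(target_bearing_deg - rov_heading_deg)
    r = min(target_dist_m / max_range_m, 1.0) * map_radius_px
    return r * math.sin(rel), -r * math.cos(rel)   # screen y grows downward

# The North cardinal marker sits at bearing 0 on the map rim:
# n_x, n_y = minimap_icon(rov_heading_deg, 0.0, max_range_m)
```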
Data Engine
[076] The data engine, which mediates the data storage and data transfer functions of the invention, incorporates the logging and supervision modules.
[077] The logging module records all information made available by the operating system and saves such data in a central database for future access. Available information may include any or all of telemetry data, sonar data, 3D models, bathymetry, waypoints, checkpoints, alarms or malfunctions, procedures, operations, and navigation records such as flight path information, positioning and inertia data, etc.
[078] An essential part of any offshore operation is providing critical data to the customer after the operation is completed. After the operation, during the interrogation and reporting stage, the interrogation and reporting module can provide a full 3D scene or reproduction of the operation. The interrogation and reporting module can provide a report of the planned flight path versus the actual flight path, waypoints, checkpoints, various deviations from the plan, alarms raised by the ROV (including details of alarm type, time and location), procedures, checkpoints, etc., to be released to the customer. Therefore, the operating system is configured to provide interactive four-dimensional reports (three spatial dimensions plus time) for each operation. This enables quick analysis and a comprehensive understanding of operations.
[079] Yet another software element that interacts with the Navigation Interface is the supervisor module. The execution of the supervisor module allows one or more supervisors to see and/or use the Navigation Interface, and, by extension, any ROV 1 being controlled from the interface. These supervisors need not share the location of the pilot or pilots, but can instead employ the connectivity elements described in Figures 3B and 3C. A plurality of multi-touch supervision consoles can be used in different locations. For example, nine monitors might be connected to three exemplary hardware structures, each including an ROV 1, where only one operating system 3 gathers the ROV data and shares the information with the others. Alternatively, between 1 and 12 networked monitors can be used, and preferably between 3 and 9.
[080] The networking shown in Figures 3B and 3C can reduce risks, such as human error, in multiple-ROV operations, even those coordinated from separate vessels. Networking through the supervisor module allows information to be shared between ROV systems, personnel and operations throughout the entire operation workflow.
[081] Therefore, a system and method related to the navigation and control of ROVs have been illustrated and described. The method and system are not limited to any specific hardware or software configuration. The many variations, modifications and alternative applications of the invention that may be apparent to those skilled in the art, and that do not depart from the scope of the invention, are considered to be covered by the invention.
Claims:
Claims (43)
[0001]
1. System for exploration, CHARACTERIZED by the fact that it comprises: - a remotely operated vehicle (ROV), wherein said ROV comprises a video camera to acquire a live video feed; - a network operating system that additionally comprises a display mechanism, wherein said display mechanism comprises: - a database module of 3D elements representing objects arranged in an operating environment of said remotely operated vehicle; - a virtual video generation module for generating a virtual video feed incorporating said 3D elements; - a synchronization module for synchronizing an angle and position of a virtual camera with an angle and position of the video camera, wherein the virtual camera defines a field of view for the virtual video feed, wherein the field of view is preferably between about 45 and 144 degrees from a central point of view; and - an overlay module for overlaying said virtual video feed and said live video feed, wherein said overlay module is configured to modulate the transparency or opacity of a region of lesser interest in one of the virtual or live video feeds so that a corresponding region of the other video feed is more visible, said superimposed live and virtual video feeds comprising a hybrid 3D image; and - a navigation interface configured to display said hybrid 3D image, said navigation interface comprising at least one networked monitor.
[0002]
2. System according to claim 1, CHARACTERIZED by the fact that the overlay module is configured to overlay graphic information, textual information or both, on the hybrid 3D image.
[0003]
3. System according to claim 1, CHARACTERIZED by the fact that said overlay is based on a luminance threshold in Red-Green-Blue hexadecimal format, in which the luminance is adjusted to values between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40.
[0004]
4. System according to any one of claims 1, CHARACTERIZED by the fact that the operating system is capable of sharing data with a plurality of remote monitors, said plurality comprising up to 12 remote monitors, and preferably between 3 and 9 remote monitors.
[0005]
A system according to any one of claims 1, CHARACTERIZED by further comprising an external system configured to determine whether the system is working or is in a fault state.
[0006]
6. System according to claim 5, CHARACTERIZED by the fact that the external system is configured to switch the monitor input from the hybrid 3D image to said live video feed if the system is in a fault state.
[0007]
7. System according to any one of claims 1, CHARACTERIZED by the fact that the navigation interface comprises at least three monitors, wherein said monitors are arranged adjacent to each other so that the intermediate monitor displays the overlaid virtual and live video feeds, while the two side monitors display an expanded view of an operating field.
[0008]
8. System according to any one of claims 1, CHARACTERIZED by the fact that 3D elements are obtained from pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs, and bathymetry.
[0009]
9. Method of exploration, CHARACTERIZED by the fact that it comprises: - storing 3D elements in a database module, said 3D elements representing objects arranged in the operating environment of the remotely operated vehicle; - generating a virtual video feed of said 3D elements; - obtaining a live video feed using a video camera; - synchronizing an angle and position of a virtual camera with an angle and position of the video camera mounted on the remotely operated vehicle, wherein the virtual camera defines a field of view for the virtual video feed; and - overlaying said virtual video feed and said live video feed on a video screen, wherein the overlay further comprises the step of modulating the transparency or opacity of a region of lesser interest in one of the virtual or live video feeds so that a corresponding region of the other video feed is more visible, said overlaid virtual and live video feeds comprising a hybrid 3D image.
[0010]
10. The method of claim 9, CHARACTERIZED by the fact that it further comprises superimposing graphical information, textual information, or both, on the hybrid 3D image.
[0011]
11. Method according to claim 9, CHARACTERIZED by the fact that the step of modulating transparency or opacity further comprises establishing a luminance threshold.
[0012]
12. Method according to claim 11, CHARACTERIZED by the fact that establishing the luminance threshold comprises maintaining a background of the virtual video feed at a higher transparency than the background of the live video feed.
[0013]
13. Method according to claim 9, CHARACTERIZED by the fact that synchronizing the angle and position of the video camera comprises the steps of: - calculating an angle between a vehicle heading and a direction of a video camera field of view; - calculating an angle between a vehicle's vertical orientation and the direction of the video camera's field of view; and - calculating an angle between the vehicle and a geographic horizon.
[0014]
14. Method according to any one of claims 9, CHARACTERIZED by the fact that the overlay step further comprises modulating the amount of 2D or 3D information overlaid on the video screen based on the position of the ROV with respect to objects arranged in the operating environment of the remotely operated vehicle.
[0015]
15. Method according to any one of claims 9, CHARACTERIZED by the fact that the overlay step further comprises providing a mini map, said mini map defining a computer generated graphic illustrating either or both cardinal points or a position of an object in 3D.
[0016]
16. Method according to any one of claims 9, CHARACTERIZED by the fact that it comprises the step of configuring the operating system so that entering, exiting, or staying longer than a designated time in a GPS-determined position triggers any one or combination of an alarm, notification, procedure change, or task change.
[0017]
17. The method according to claim 9, CHARACTERIZED by the fact that the 3D elements are obtained from any pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs, and bathymetry.
[0018]
18. Computer readable medium, including code for exploration, the code CHARACTERIZED by the fact that, when executed, it is operable to: store 3D elements in a database module, said 3D elements representing objects arranged in the operating environment of the remotely operated vehicle; generate a virtual video feed of said 3D elements; obtain a live video feed using a video camera; synchronize an angle and position of a virtual camera with an angle and position of the remotely operated vehicle mounted video camera, wherein the virtual camera defines a field of view for the virtual video feed; and overlay said virtual video feed and said live video feed onto a video screen, wherein the overlay further comprises modulating the transparency or opacity of a region of lesser interest in one of the virtual or live video feeds such that a corresponding region of the other video feed is more visible, said superimposed live and virtual video feeds comprising a hybrid 3D image.
[0019]
19. A computer readable medium according to claim 18, CHARACTERIZED by the fact that the code for synchronizing the angle and position of the video camera is additionally operable to: calculate an angle between a vehicle heading and a direction of a video camera field of view; calculate an angle between a vehicle's vertical orientation and the direction of the video camera's field of view; and calculate an angle between the vehicle and a geographic horizon.
[0020]
20. Computer readable medium according to claim 18, CHARACTERIZED by the fact that the 3D elements are obtained from any of the pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs and bathymetry.
[0021]
21. Exploration system CHARACTERIZED by the fact that it comprises: a remotely operated vehicle (ROV), wherein said ROV comprises a video camera for the acquisition of a live video feed; a network operating system comprising a display mechanism, wherein said display mechanism further comprises: a database module of 3D elements representing objects arranged in an operating environment of said remotely operated vehicle; a virtual video generation module for generating a virtual video feed incorporating said 3D elements; and an overlay module for overlaying said virtual video feed and said live video feed, said overlaid virtual and live video feeds comprising hybrid 3D images; and a navigation interface configured to display said hybrid 3D images.
[0022]
22. System according to claim 21, CHARACTERIZED by the fact that the overlay module is configured to overlay graphical information, textual information, or both, onto the hybrid 3D images.
[0023]
23. System according to claim 21, CHARACTERIZED by the fact that said overlay is based on a luminance threshold in the Red-Green-Blue hexadecimal format, in which the luminance is adjusted to values between 0-0-0 and 255-255-255, and preferably between 0-0-0 and 40-40-40.
[0024]
24. System according to claim 21, CHARACTERIZED by the fact that the operating system is capable of sharing data with a plurality of remote monitors, said plurality comprising up to 12 remote monitors, and preferably between 3 and 9 remote monitors.
[0025]
25. The system of claim 21, CHARACTERIZED by the fact that it further comprises an external system configured to determine whether the system is operating or is in a fault state.
[0026]
26. System according to claim 25, CHARACTERIZED by the fact that the external system is configured to switch the hybrid 3D image monitor input to said live video feed, if the system is in a fault state.
[0027]
27. System according to claim 21, CHARACTERIZED by the fact that the navigation interface comprises at least three monitors connected in a network, wherein said monitors are arranged adjacent to each other in such a way that the middle monitor displays the overlaid virtual and live video feeds, while both side monitors display an expanded view of an operating field.
[0028]
28. The system according to claim 21, CHARACTERIZED by the fact that the 3D elements are obtained from any pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs, and bathymetry.
[0029]
29. The system according to claim 21, CHARACTERIZED by the fact that said overlay module is configured to modulate the transparency or opacity of a region of lesser interest in one of the live or virtual video feeds in such a way that a corresponding region of the other video feed is more visible.
[0030]
30. Method for exploration, CHARACTERIZED by the fact that it comprises: storing 3D elements in a database module, said 3D elements representing objects arranged in the operating environment of the remotely operated vehicle; generating a virtual video feed of said 3D elements; obtaining a live video feed using a video camera; and superimposing said virtual video feed and said live video feed on a video screen, said superimposed virtual and live video feeds comprising hybrid 3D images.
[0031]
31. Method according to claim 30, CHARACTERIZED in that it additionally comprises the overlay of graphical information, textual information, or both, on the hybrid 3D images.
[0032]
32. Method according to claim 30, CHARACTERIZED in that the step of modulating transparency or opacity further comprises establishing a luminance threshold.
[0033]
33. The method according to claim 32, CHARACTERIZED by the fact that establishing the luminance threshold comprises maintaining a background of the virtual video feed at a greater transparency than the background of the live video feed.
[0034]
34. Method according to claim 30, CHARACTERIZED by the fact that the synchronization of the angle and position of the video camera comprises the steps of: calculating an angle between a vehicle heading and a direction of a field of view of the video camera; calculating an angle between a vehicle's vertical orientation and the direction of the video camera's field of view; and calculating an angle between the vehicle and a geographic horizon.
[0035]
35. Method according to claim 30, CHARACTERIZED by the fact that the overlay step further comprises modulating the amount of 2D or 3D information overlaid on the video screen based on the position of the ROV relative to objects arranged in the operating environment of the remotely operated vehicle.
[0036]
36. The method according to claim 30, CHARACTERIZED in that the overlay step further comprises providing a mini-map, said mini-map defining a computer-generated graphic showing one or both cardinal points or a position of an object in 3D.
[0037]
37. Method according to claim 30, CHARACTERIZED by the fact that it comprises the step of configuring the operating system in such a way that entering, exiting or staying longer than a designated time in a GPS-determined position triggers any one or combination of an alarm, notification, procedure change, or task change.
[0038]
38. The method of claim 30, CHARACTERIZED in that the 3D elements are obtained from any of the pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs, and bathymetry.
[0039]
39. The method according to claim 30, CHARACTERIZED in that the overlay further comprises the step of modulating the transparency or opacity of a region of lesser interest in one of the live or virtual video feeds such that a corresponding region of the other video feed is more visible.
[0040]
40. Computer readable medium, including code for exploration, the code, when executed, CHARACTERIZED by the fact that it is operable to: store 3D elements in a database module, said 3D elements representing objects arranged in the operating environment of the remotely operated vehicle; generate a virtual video feed of said 3D elements; obtain a live video feed using a video camera; and superimpose said virtual video feed and said live video feed on a video screen, said superimposed virtual and live video feeds comprising hybrid 3D images.
[0041]
41. A computer readable medium according to claim 40, CHARACTERIZED by the fact that the code for synchronizing the angle and position of the video camera is additionally operable to: calculate an angle between a vehicle heading and a direction of a video camera field of view; calculate an angle between a vehicle's vertical orientation and the direction of the video camera's field of view; and calculate an angle between the vehicle and a geographic horizon.
[0042]
42. Computer readable medium according to claim 40, CHARACTERIZED by the fact that the 3D elements are obtained from any of the pre-recorded sonar data, live sonar data, programmed 3D objects, user-determined inputs and bathymetry.
[0043]
43. Computer readable medium according to claim 40, CHARACTERIZED by the fact that the overlay additionally comprises the step of modulating the transparency or opacity of a region of lesser interest in one of the virtual or live video feeds such that a corresponding region of the other video feed is more visible.
Similar technologies:
Publication number | Publication date | Patent title
BR112014011172B1|2021-07-27|SYSTEM AND METHOD OF OPERATION FOR REMOTELY OPERATED VEHICLES WITH OVERLAPPED 3D IMAGE
US10482659B2|2019-11-19|System and method for superimposing spatially correlated data over live real-world images
US20210304430A1|2021-09-30|System and method of operation for remotely operated vehicles for simultaneous localization and mapping
US20210295605A1|2021-09-23|System and method of operation for remotely operated vehicles with improved position estimation
US20210309331A1|2021-10-07|System and method of operation for remotely operated vehicles leveraging synthetic data to train machine learning models
US20210366097A1|2021-11-25|System and method of operation for remotely operated vehicles for automatic detection of structure integrity threats
AU2015345061A1|2017-06-22|A method of controlling a subsea platform, a system and a computer program product
US10943491B2|2021-03-09|Method of synthetic visualization of a scene viewed from an aircraft and to a synthetic vision system for implementing such a method
US20210312712A1|2021-10-07|Geo-augmented field excursion for geological sites
Patent family:
Publication number | Publication date
AU2012335330B2|2017-07-20|
CA2854595A1|2013-05-16|
ES2895454T3|2022-02-21|
AU2017236009A1|2017-10-26|
US9195231B2|2015-11-24|
US10424119B2|2019-09-24|
US20170323487A1|2017-11-09|
US20160063768A1|2016-03-03|
WO2013068821A3|2014-04-24|
US20140316611A1|2014-10-23|
AU2012335330A1|2014-06-05|
WO2013068821A2|2013-05-16|
AU2019264604A1|2019-12-05|
EP2777024A2|2014-09-17|
US9741173B2|2017-08-22|
PT2777024T|2021-11-08|
AU2017236009B2|2019-08-22|
HRP20211664T1|2022-02-18|
SG11201402114PA|2014-06-27|
EP2777024B1|2021-08-04|
BR112014011172A2|2017-05-09|
AU2019264604B2|2020-05-21|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US5412569A|1994-03-29|1995-05-02|General Electric Company|Augmented reality maintenance system with archive and comparison device|
US5706195A|1995-09-05|1998-01-06|General Electric Company|Augmented reality maintenance system for multiple rovs|
US6774966B1|1997-06-10|2004-08-10|Lg.Philips Lcd Co., Ltd.|Liquid crystal display with wide viewing angle and method for making it|
JPH11139390A|1997-11-10|1999-05-25|Ishikawajima Harima Heavy Ind Co Ltd|Submerged object searching device|
US6330356B1|1999-09-29|2001-12-11|Rockwell Science Center Llc|Dynamic visual registration of a 3-D object with a graphical model|
AT540347T|2000-05-01|2012-01-15|Irobot Corp|METHOD AND DEVICE FOR CONTROLLING A MOBILE ROBOT|
US8015507B2|2001-11-05|2011-09-06|H2Eye Limited|Graphical user interface for a remote operated vehicle|
US7071970B2|2003-03-10|2006-07-04|Charles Benton|Video augmented orientation sensor|
US7292257B2|2004-06-28|2007-11-06|Microsoft Corporation|Interactive viewpoint video system and process|
WO2006011153A2|2004-07-30|2006-02-02|Extreme Reality Ltd.|A system and method for 3d space-dimension based image processing|
US8577538B2|2006-07-14|2013-11-05|Irobot Corporation|Method and system for controlling a remote vehicle|
US7352292B2|2006-01-20|2008-04-01|Keith Alter|Real-time, three-dimensional synthetic vision display of sensor-validated terrain data|
US7693617B2|2006-09-19|2010-04-06|The Boeing Company|Aircraft precision approach control|
US8301318B2|2008-03-05|2012-10-30|Robotic Research Llc|Robotic vehicle remote control system having a virtual operator environment|
US8423292B2|2008-08-19|2013-04-16|Tomtom International B.V.|Navigation device with camera-info|
US20100091036A1|2008-10-10|2010-04-15|Honeywell International Inc.|Method and System for Integrating Virtual Entities Within Live Video|
GB2464985A|2008-11-03|2010-05-05|Wireless Fibre Systems Ltd|Underwater Vehicle Guidance|
FR2949167B1|2009-08-11|2016-07-01|Alain Dinis|SYSTEM AND METHOD FOR REAL-TIME VIRTUAL DIVING|
US20110153189A1|2009-12-17|2011-06-23|Garmin Ltd.|Historical traffic data compression|
US8725273B2|2010-02-17|2014-05-13|Irobot Corporation|Situational awareness for teleoperation of a remote vehicle|
SG11201402114PA|2011-11-09|2014-06-27|Abyssal S A|System and method of operation for remotely operated vehicles with superimposed 3d imagery|US9043483B2|2008-03-17|2015-05-26|International Business Machines Corporation|View selection in a vehicle-to-vehicle network|
US9123241B2|2008-03-17|2015-09-01|International Business Machines Corporation|Guided video feed selection in a vehicle-to-vehicle network|
SG11201402114PA|2011-11-09|2014-06-27|Abyssal S A|System and method of operation for remotely operated vehicles with superimposed 3d imagery|
US9754507B1|2013-07-02|2017-09-05|Rockwell Collins, Inc.|Virtual/live hybrid behavior to mitigate range and behavior constraints|
US11181637B2|2014-09-02|2021-11-23|FLIR Belgium BVBA|Three dimensional target selection systems and methods|
US10802141B2|2014-05-30|2020-10-13|FLIR Belgium BVBA|Water temperature overlay systems and methods|
GB201405527D0|2014-03-27|2014-05-14|Mill Facility The Ltd|A driveable vehicle unit|
DE102014107211A1|2014-05-22|2015-11-26|Atlas Elektronik Gmbh|Device for displaying a virtual reality as well as measuring device|
WO2016073060A2|2014-09-02|2016-05-12|Flir Systems, Inc.|Augmented reality sonar imagery systems and methods|
US10444349B2|2014-09-02|2019-10-15|FLIR Belgium BVBA|Waypoint sharing systems and methods|
NL2013804B1|2014-11-14|2016-10-07|Fugro Subsea Services Ltd|A method of controlling a subsea platform, a system and a computer program product.|
US20160259051A1|2015-03-05|2016-09-08|Navico Holding As|Systems and associated methods for producing a 3d sonar image|
JP6317854B2|2015-03-30|2018-04-25|株式会社カプコン|Virtual three-dimensional space generation method, video system, control method thereof, and recording medium readable by computer device|
FR3037429B1|2015-06-15|2018-09-07|Donecle|SYSTEM AND METHOD FOR AUTOMATIC SURFACE INSPECTION|
US20170094227A1|2015-09-25|2017-03-30|Northrop Grumman Systems Corporation|Three-dimensional spatial-awareness vision system|
CA2948761A1|2015-11-23|2017-05-23|Wal-Mart Stores, Inc.|Virtual training system|
CN105679169A|2015-12-31|2016-06-15|中国神华能源股份有限公司|Railway electronic sand table system|
EP3475778A4|2016-06-28|2019-12-18|Cognata Ltd.|Realistic 3d virtual world creation and simulation for training automated driving systems|
GB2557572B|2016-09-20|2020-02-12|Subsea 7 Ltd|Performing remote inspection at subsea locations|
IL251189D0|2017-03-15|2017-06-29|Ophir Yoav|Gradual transitioning between two-dimensional and theree-dimensional augmented reality images|
US20200386552A1|2017-07-18|2020-12-10|Bechtel Oil, Gas & Chemicals, Inc.|Primary navigation system for tugboats with an obstructed pilot view|
GB2565101A|2017-08-01|2019-02-06|Lobster Pictures Ltd|Method for integrating photographic images and BIM data|
US11010975B1|2018-03-06|2021-05-18|Velan Studios, Inc.|Remote camera augmented reality system|
WO2020030949A1|2018-08-08|2020-02-13|Abyssal S.A.|System and method of operation for remotely operated vehicles for automatic detection of structure integrity threats|
SG11202100945WA|2018-08-08|2021-02-25|Abyssal S A|System and method of operation for remotely operated vehicles leveraging synthetic data to train machine learning models|
GB2578592A|2018-10-31|2020-05-20|Sony Interactive Entertainment Inc|Apparatus and method of video playback|
EP3712558A1|2019-03-21|2020-09-23|Rovco Limited|Surveying system for surveying in harsh environments|
US11228737B2|2019-07-31|2022-01-18|Ricoh Company, Ltd.|Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium|
Legal status:
2018-12-04| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-11-26| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-06-01| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-06-29| B09W| Correction of the decision to grant [chapter 9.1.4 patent gazette]|Free format text: THE PRESENT APPLICATION RECEIVED A NOTICE OF ALLOWANCE PUBLISHED IN RPI NO. 2630 OF 01/06/2021. THROUGH "FALE CONOSCO" MESSAGE 936994, THE APPLICANT REQUESTS THE CORRECTION OF TRANSLATION ERRORS IN THE CLAIMS OF THE ALLOWANCE PETITION. THE CORRECTIONS THAT SHALL FORM PART OF THE LETTERS PATENT ARE PRESENTED IN PETITION 870210053585 OF 15/06/2021. IN VIEW OF THIS, THE ALLOWANCE OPINION IS RECTIFIED, AND THE DOCUMENTS LISTED IN TABLE 1 OF THAT OPINION SHALL FORM PART OF THE LETTERS PATENT.
2021-07-13| B350| Update of information on the portal [chapter 15.35 patent gazette]|
2021-07-27| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 08/11/2012, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
申请号 | 申请日 | 专利标题
PT10598911|2011-11-09|
PTPPP105989|2011-11-09|
US201261681411P| true| 2012-08-09|2012-08-09|
US61/681,411|2012-08-09|
PCT/IB2012/002281|WO2013068821A2|2011-11-09|2012-11-08|System and method of operation for remotely operated vehicles with superimposed 3d imagery|